PROBLEMS OF NATURAL LANGUAGE UNDERSTANDING IN TELETYPED INTERVIEW DIALOGUES

By `natural language' I shall mean everyday American English such as is used by readers of this book in ordinary conversations. It is still difficult to be explicit about the processes which enable humans to interpret and respond to natural language. Philosophers, linguists and psychologists have speculated about and investigated natural language with various purposes and few useful results. Now attempts are being made in artificial intelligence to write algorithms which `understand' what is being expressed in natural language utterances.

During the 1960's, when machine processing of natural language was dominated by syntactic considerations, it became clear that this approach was insufficient. The current view is that to understand what utterances say, knowledge about linguistic syntax and semantics must be combined with knowledge about an underlying conceptual structure containing a world-model and an ability to draw inferences. How to achieve this combination efficiently represents a huge task for both theory and implementation.

Since the behavior being simulated by our paranoid model is the linguistic-conceptual behavior of paranoid patients in a psychiatric interview, the model must have some ability to process and respond to natural language input in a manner indicating the underlying pathological processes. Our aim was to develop a method for understanding everyday English sufficient
for the model to behave conversationally in a paranoid way in a circumscribed situation. What is said in this situation is far richer than what is said in conversations with a block-stacking robot, but its requirements for constructing an interpretation of an input are not as complex as trying to understand anything said in English by anybody in any dialogue situation.

We took a pragmatic approach which considered `understanding' to represent `getting the message' of an utterance by gleaning some (not all) of the relations between its terms. This straightforward approach to a complex problem has its drawbacks, as will be shown, but we were striving for a sufficiency to demonstrate paranoia rather than complete comprehension of English.

Linguistic approaches cite traditional problems with ambiguity, as illustrated in the following example from Wilks ( ). Suppose I walked up to you, a stranger, on the street on Sunday morning and said

(1) `He fell while getting to the ball'

Admittedly this is a strange scene, and in this situation you would think me to be crazy, hungover and maybe still drunk, but the example is no more weird than the isolated examples discussed in the linguistics literature. Suppose further that in your personal `dictionary' the word `ball' has at least two senses, (A) a spherical physical object used in a game, and (B) a formal dance. (It probably also has a third sense as a verb, but we will ignore this more or less recent example of semantic shift.) Having no further information in this situation and attempting to construct an interpretation of my utterance, you would be puzzled as to whether I was referring to a ball game or a dance. If we then continued on our respective ways, saying nothing else, your puzzlement would continue and even increase: `I don't know what he was referring to nor why he even said that to me.'

The ambiguity arises because of the two word senses for `ball', each of which would give the utterance a meaningful interpretation. But the example is extremely forced and artificial. Such isolated utterances cannot be disambiguated (`uniqueated' is a better term), but this is no handicap for ordinary human conversations, in which ambiguities hardly arise at all. Besides the utterance itself, extra information is usually available in the form of contextual and situational knowledge; even better, one can always ask. If I had said only utterance (1) to you, you could simply ask:

(2) `What do you mean?'

and my reply would indicate something about a game or a dance or who `he' was.

Utterances occur in conversations which take place in a sociopsychological situation. The communicants have roles and intentions towards one another. If the situation is that of a medical or psychiatric interview between doctor and patient and the doctor asks:

(3) `How much do you drink?'

we know from the nature of the situation that `drink' means `drink alcohol' and does not refer to total fluid intake.

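The role of situational knowledge in selecting word senses can be made concrete with a small hypothetical sketch in Python. The sense table, the situation names and the fallback question are invented for illustration and are not the model's actual machinery; the point is only that the interview situation fixes the sense of `drink', while an isolated utterance like (1) leaves `ball' ambiguous and the natural move is to ask.

# Hypothetical sketch of situation-driven word-sense selection.
# The sense table and situation names are invented for illustration;
# they are not the model's actual data structures.

SENSES = {
    "ball":  {"GAME": "spherical object used in a game",
              "DANCE": "a formal dance"},
    "drink": {"INTERVIEW": "drink alcohol",
              "GENERAL": "ingest any fluid"},
}

def resolve(word, situation):
    """Pick the sense of `word` activated by the current situation.
    Return None when the situation leaves the word ambiguous."""
    senses = SENSES.get(word, {})
    if situation in senses:
        return senses[situation]
    if len(senses) == 1:
        return next(iter(senses.values()))
    return None          # more than one sense survives

def respond(word, situation):
    sense = resolve(word, situation)
    if sense is None:
        return "What do you mean?"   # ask rather than guess
    return f"({word} understood as: {sense})"

print(respond("drink", "INTERVIEW"))   # interview context fixes the sense
print(respond("ball", "STREET"))       # isolated utterance: ask for clarification
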
Dialogues represent connected discourse in which all the utterances, except perhaps for opening greetings, are connectable to previous utterances. Contexts and subcontexts, topics and subtopics surround any given utterance and activate relevant word senses such that alternative senses do not arise in the comprehension process. In spoken dialogues, intonations and word emphases are further means for avoiding ambiguities, but the connected discourse of dialogues brings problems of its own to the algorithmist, whose program must keep track of what is going on and what has been said before. Foremost is the problem of anaphoric reference.

Fragments

Another major problem for algorithms which attempt to understand discourse is that many of the input expressions are not well-formed. All sorts of fragments and ellipses appear which must somehow be connected to conceptualizations under discussion. For example, consider the following exchange:

(10) Dr. - How do you like the hospital?

(11) Pt. - I shouldn't be here.

(12) Dr. - Why not?

The question (12) is an elliptical expression for the full conceptualization

`Why should you not be in the hospital?'

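How such an elliptical fragment might be expanded can be suggested by a small hypothetical sketch. The (actor, modal, action) triple is an invented stand-in for whatever conceptualization the program actually stores; the sketch only shows `Why not?' being filled out from the patient's preceding statement, restated from the interviewer's viewpoint.

# Hypothetical sketch of expanding an elliptical question against the
# conceptualization under discussion. The (actor, modal, action) triple
# is an invented stand-in for the internal representation.

from dataclasses import dataclass

@dataclass
class Conceptualization:
    actor: str      # e.g. "you" (the patient, from the doctor's viewpoint)
    modal: str      # e.g. "should not"
    action: str     # e.g. "be in the hospital"

def expand_why_not(previous: Conceptualization) -> str:
    """Turn the fragment `Why not?' into a full question about the
    previous conceptualization."""
    modal = previous.modal.replace("not", "").strip()   # "should not" -> "should"
    return f"Why {modal} {previous.actor} not {previous.action}?"

# (11) Pt. - I shouldn't be here.   (stored from the doctor's viewpoint)
last = Conceptualization(actor="you", modal="should not", action="be in the hospital")

# (12) Dr. - Why not?
print(expand_why_not(last))    # -> Why should you not be in the hospital?
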
Junk words and phrases (`well now', `tell me more') and go-ahead signals must be responded to by continuation of a topic.

For example:

(13) Pt. - I went to the track last week.

(14) Dr. - Really?

Such expressions as (14) stand in a meta-relation to the topic and serve to keep the conversation going.

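A hypothetical sketch of how such meta-expressions might be handled follows. The list of go-ahead signals and the canned topic continuation are invented for illustration; the point is only that input of this kind is answered by continuing the current topic rather than by treating it as a new question.

# Hypothetical sketch of handling go-ahead signals. The signal list and
# canned topic continuations are invented; they only illustrate that such
# input is answered by continuing the topic, not by starting a new one.

GO_AHEAD = {"really?", "go on", "tell me more", "i see", "well now"}

# Continuations indexed by the topic currently under discussion (invented).
TOPIC_CONTINUATIONS = {
    "horseracing": "I lost money on the horses, and the bookies were not happy.",
}

def reply(utterance, current_topic):
    if utterance.strip().lower() in GO_AHEAD:
        # meta-expression: keep the conversation going on the same topic
        return TOPIC_CONTINUATIONS.get(current_topic, "Go on with what?")
    return None    # not a go-ahead signal; handle as ordinary input elsewhere

# (13) Pt. - I went to the track last week.   (topic becomes horseracing)
# (14) Dr. - Really?
print(reply("Really?", "horseracing"))
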
Rejoinders

Sometimes the input expression from the interviewer is a rejoinder, a reply to a reply by the patient. For instance:

(15) Dr. - Who are you afraid of?

(16) Pt. - The Mafia is out to get me.

(17) Dr. - I would be afraid of them also.

in which (17) is a rejoinder. Such expressions are not requests for information but provide information for the patient's model of the interviewer.

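A hypothetical sketch of this distinction follows. The classification test and the attributes in the interviewer model are invented for illustration; the sketch only shows a rejoinder such as (17) being recorded as information about the interviewer instead of being answered as a question.

# Hypothetical sketch: a rejoinder is not answered as a request for
# information but recorded in the patient's model of the interviewer.
# The attribute names and the classification test are invented.

interviewer_model = {"sympathetic": 0, "hostile": 0}

def process(utterance):
    text = utterance.strip()
    if text.endswith("?"):
        return "question"                       # handled by the answering machinery
    if "afraid" in text.lower() and text.lower().startswith("i would"):
        interviewer_model["sympathetic"] += 1   # (17) reads as agreement/sympathy
        return "rejoinder"
    return "statement"

# (17) Dr. - I would be afraid of them also.
print(process("I would be afraid of them also."))   # -> rejoinder
print(interviewer_model)                             # sympathetic count incremented
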
Interviewer-interviewee Relations

It is characteristic of psychiatric interviewing that the participants from time to time do not simply talk about the patient. Two situations exist concurrently in an interview, one being talked about and one the participants are in. At times the second situation becomes the first. When the participants discuss one another and their relation, the dialogue expressions contain intentional verbs which in English fit the pattern `I X you' or `you X me'. The comprehension process must distinguish clearly between subject and object in the case of some of these verbs. For example, in

(18) I like you

the speaker `I' experiences the liking, but in

(19) Do I please you?

the `you' experiences the pleasure as a consequence of something `I' does.
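
The required subject-object distinction can be illustrated with a small hypothetical sketch. The two verb classes below are invented for the example; the sketch only shows that for a verb like `like' the experiencer is the subject, while for a verb like `please' it is the object.

# Hypothetical sketch of deciding who experiences the feeling in
# `I X you' / `you X me' expressions. The two verb classes are an
# invented illustration of the subject/object distinction described above.

SUBJECT_EXPERIENCER = {"like", "fear", "trust", "hate"}   # `I like you': I feel it
OBJECT_EXPERIENCER  = {"please", "frighten", "bother"}    # `Do I please you?': you feel it

def experiencer(subject, verb, obj):
    """Return which participant experiences the state named by the verb."""
    if verb in SUBJECT_EXPERIENCER:
        return subject
    if verb in OBJECT_EXPERIENCER:
        return obj
    return None   # verb not classified

# (18) I like you        -> the speaker `I' experiences the liking
print(experiencer("I", "like", "you"))      # -> I
# (19) Do I please you?  -> the `you' experiences the pleasure
print(experiencer("I", "please", "you"))    # -> you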